2024-03-19 17:51:53,961 [ 90024 ] INFO : ClickHouse root is not set. Will use /home/ubuntu/_work/_temp/test/git-repo-copy (runner:41, check_args_and_update_paths)
2024-03-19 17:51:53,962 [ 90024 ] INFO : Cases dir is not set. Will use /home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration (runner:89, check_args_and_update_paths)
2024-03-19 17:51:53,962 [ 90024 ] INFO : src dir is not set. Will use /home/ubuntu/_work/_temp/test/git-repo-copy/src (runner:96, check_args_and_update_paths)
2024-03-19 17:51:53,962 [ 90024 ] INFO : base_configs_dir: /home/ubuntu/_work/_temp/test/git-repo-copy/programs/server, binary: /home/ubuntu/_work/_temp/test/build/clickhouse, cases_dir: /home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration (runner:98, check_args_and_update_paths)
clickhouse_integration_tests_volume
WARNING: Ignoring custom format, because both --format and --quiet are set.
Running pytest container as: 'docker run --rm --name clickhouse_integration_tests_u76c73 --privileged --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-odbc-bridge:/clickhouse-odbc-bridge --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-library-bridge:/clickhouse-library-bridge --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/src/Server/grpc_protos:/ClickHouse/src/Server/grpc_protos --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e XTABLES_LOCKFILE=/run/host/xtables.lock -e PYTHONUNBUFFERED=1 -e DOCKER_DOTNET_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_HELPER_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_BASE_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_KERBERIZED_HADOOP_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_KERBEROS_KDC_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_JAVA_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_JS_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_PHP_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e PYTEST_OPTS='--dist=loadfile -n 10 -rfEps --run-id=2 --color=no --durations=0 test_cgroup_limit/test.py::test_cgroup_cpu_limit -vvv' altinityinfra/integration-tests-runner:0-0a8ac3b092733da37e3e2a0079c486938a36790d '.
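The runner script quoted above assembles this docker run invocation as a single shell string and hands it to subprocess.check_call(cmd, shell=True), which is why a non-zero pytest exit surfaces as the CalledProcessError at the end of this log. A minimal sketch of that pattern in Python follows; the volume and environment lists are trimmed for brevity, and the variable names are illustrative rather than taken from the runner source.

    # Sketch of how the integration-test runner launches the pytest container.
    # Only the overall command shape is taken from the log above; names and the
    # trimmed option list are illustrative.
    import subprocess

    image = "altinityinfra/integration-tests-runner:0-0a8ac3b092733da37e3e2a0079c486938a36790d"
    pytest_opts = (
        "--dist=loadfile -n 10 -rfEps --run-id=2 --color=no --durations=0 "
        "test_cgroup_limit/test.py::test_cgroup_cpu_limit -vvv"
    )

    cmd = (
        "docker run --rm --name clickhouse_integration_tests_u76c73 --privileged "
        "--volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse "
        "--volume=/home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration:/ClickHouse/tests/integration "
        "-e PYTHONUNBUFFERED=1 "
        f"-e PYTEST_OPTS='{pytest_opts}' "
        f"{image}"
    )

    # check_call raises CalledProcessError when the container (i.e. pytest) exits
    # non-zero, producing the final traceback seen at the bottom of this log.
    subprocess.check_call(cmd, shell=True)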
Start tests
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-8.0.2, pluggy-1.4.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /ClickHouse/tests/integration
configfile: pytest.ini
plugins: repeat-0.9.3, xdist-3.5.0, random-0.2, timeout-2.2.0, order-1.0.0
timeout: 900.0s
timeout method: signal
timeout func_only: False
created: 10/10 workers
10 workers [1 item]

scheduling tests via LoadFileScheduling

test_cgroup_limit/test.py::test_cgroup_cpu_limit
[gw0] [100%] FAILED test_cgroup_limit/test.py::test_cgroup_cpu_limit

=================================== FAILURES ===================================
____________________________ test_cgroup_cpu_limit _____________________________
[gw0] linux -- Python 3.8.10 /usr/bin/python3

    def test_cgroup_cpu_limit():
        for num_cpus in (1, 2, 4, 2.8):
>           result = run_with_cpu_limit(
                "clickhouse local -q \"select value from system.settings where name='max_threads'\"",
                num_cpus,
            )

test_cgroup_limit/test.py:43:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test_cgroup_limit/test.py:38: in run_with_cpu_limit
    return run_command_in_container(cmd, *args)
test_cgroup_limit/test.py:19: in run_command_in_container
    return subprocess.check_output(
/usr/lib/python3.8/subprocess.py:415: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

input = None, capture_output = False, timeout = None, check = True
popenargs = (['docker', 'run', '--rm', '--cpus', '1', '--volume', ...],)
kwargs = {'stdout': -1}, process = stdout = b'', stderr = None, retcode = 125

    def run(*popenargs,
            input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.

        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those attributes
        will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return code
        in the returncode attribute, and output & stderr attributes if those streams
        were captured.

        If timeout is given, and the process takes too long, a
        TimeoutExpired exception will be raised.

        There is an optional argument "input", allowing you to
        pass bytes or a string to the subprocess's stdin. If you use this argument
        you may not also use the Popen constructor's "stdin" argument, as
        it will be used internally.

        By default, all communication is in bytes, and therefore any "input" should
        be bytes, and the stdout and stderr will be bytes. If in text mode, any
        "input" should be a string, and stdout and stderr will be strings decoded
        according to locale encoding, or by "encoding" if set. Text mode is
        triggered by setting any of text, encoding, errors or universal_newlines.

        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE

        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE

        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads. communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['docker', 'run', '--rm', '--cpus', '1', '--volume', '/clickhouse:/usr/bin/clickhouse', 'ubuntu:20.04', 'sh', '-c', 'clickhouse local -q "select value from system.settings where name=\'max_threads\'"']' returned non-zero exit status 125.

/usr/lib/python3.8/subprocess.py:516: CalledProcessError
------------------------------ Captured log setup ------------------------------
2024-03-19 17:51:58 [ 365 ] DEBUG : Command:['docker ps | wc -l'] (cluster.py:97, run_and_check)
2024-03-19 17:51:58 [ 365 ] DEBUG : Stdout:1 (cluster.py:105, run_and_check)
2024-03-19 17:51:58 [ 365 ] DEBUG : No running containers (conftest.py:44, cleanup_environment)
----------------------------- Captured stderr call -----------------------------
docker: Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: unable to apply cgroup configuration: cannot enter cgroupv2 "/sys/fs/cgroup/docker" with domain controllers -- it is in an invalid state: unknown.
============================== slowest durations ===============================
0.16s call     test_cgroup_limit/test.py::test_cgroup_cpu_limit
0.02s setup    test_cgroup_limit/test.py::test_cgroup_cpu_limit
0.00s teardown test_cgroup_limit/test.py::test_cgroup_cpu_limit
=========================== short test summary info ============================
FAILED test_cgroup_limit/test.py::test_cgroup_cpu_limit - subprocess.CalledPr...
============================== 1 failed in 1.50s ===============================
Traceback (most recent call last):
  File "/home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration/./runner", line 448, in <module>
    subprocess.check_call(cmd, shell=True)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'docker run --rm --name clickhouse_integration_tests_u76c73 --privileged --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-odbc-bridge:/clickhouse-odbc-bridge --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-library-bridge:/clickhouse-library-bridge --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/src/Server/grpc_protos:/ClickHouse/src/Server/grpc_protos --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e XTABLES_LOCKFILE=/run/host/xtables.lock -e PYTHONUNBUFFERED=1 -e DOCKER_DOTNET_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_HELPER_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_BASE_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_KERBERIZED_HADOOP_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_KERBEROS_KDC_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_JAVA_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_JS_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_PHP_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e PYTEST_OPTS='--dist=loadfile -n 10 -rfEps --run-id=2 --color=no --durations=0 test_cgroup_limit/test.py::test_cgroup_cpu_limit -vvv' altinityinfra/integration-tests-runner:0-0a8ac3b092733da37e3e2a0079c486938a36790d ' returned non-zero exit status 1.
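The test itself barely ran: the inner docker run exited with status 125 before clickhouse local could start, because runc could not place the ubuntu:20.04 container into the nested Docker daemon's cgroup, /sys/fs/cgroup/docker, which it reports as being in an invalid state. Below is a small diagnostic sketch, assuming a cgroup v2 host with Python 3.7+ available on the runner; the command list is copied from the CalledProcessError above, while the helper name and the choice of files to print are illustrative (the runc message likely corresponds to cgroup.type reading "domain invalid" for that node).

    # Diagnostic sketch (not part of the test suite): re-run the inner `docker run`
    # that failed above and, if it fails, dump the cgroup v2 state of the
    # /sys/fs/cgroup/docker node named in the runc error.
    import subprocess
    from pathlib import Path

    INNER_CMD = [
        "docker", "run", "--rm", "--cpus", "1",
        "--volume", "/clickhouse:/usr/bin/clickhouse",
        "ubuntu:20.04", "sh", "-c",
        "clickhouse local -q \"select value from system.settings where name='max_threads'\"",
    ]

    def dump_cgroup_state(cgroup: Path) -> None:
        # cgroup v2 exposes each node's type ("domain", "domain threaded",
        # "domain invalid", "threaded") and the controllers delegated to children.
        for name in ("cgroup.type", "cgroup.controllers", "cgroup.subtree_control"):
            path = cgroup / name
            value = path.read_text().strip() if path.exists() else "(missing)"
            print(f"{path}: {value}")

    proc = subprocess.run(INNER_CMD, capture_output=True, text=True)
    if proc.returncode != 0:
        print(f"inner container failed with exit status {proc.returncode}")
        print(proc.stderr.strip())
        dump_cgroup_state(Path("/sys/fs/cgroup/docker"))
    else:
        print(proc.stdout.strip())

If cgroup.type does read "domain invalid", a plausible first remedy is restarting the Docker daemon inside the runner container so it recreates /sys/fs/cgroup/docker; nothing in the failure points at the test code or at ClickHouse itself.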